Generalized boundary adaptation rule for minimizing rth power law distortion in high resolution quantization

Authors

  • Dominique Martinez
  • Marc M. Van Hulle
Abstract

A new generalized unsupervised competitive learning rule is introduced for adaptive scalar quantization. The rule, called the generalized Boundary Adaptation Rule (BAR_r), minimizes the rth power law distortion D_r in the high-resolution case. Simulations show that a fast version of BAR_r outperforms the generalized Lloyd I algorithm in minimizing D_1 (mean absolute error) and D_2 (mean squared error) distortion in substantially fewer iterations. In addition, since BAR_r does not require generalized centroid estimation, as Lloyd I does, it is much simpler to implement.
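The boundary-adaptation idea can be illustrated with a small sketch. This is an assumption-laden toy, not the paper's BAR_r update: it relies on the classical high-resolution observation that an optimal fixed-rate quantizer tends to equalize per-cell distortion, so an online rule can nudge each decision boundary toward the neighbouring cell with the larger running rth-power distortion. All names and step sizes below are my own illustrative choices.

```python
import numpy as np

def adapt_boundaries(samples, n_cells=8, r=2, lr=0.05, seed=0):
    """Toy online boundary adaptation for scalar quantization.

    Illustrative only (not the BAR_r rule from the paper): each interior
    boundary is nudged toward the neighbouring cell whose running r-th
    power distortion is larger, shrinking that cell. Codewords are taken
    as cell midpoints rather than generalized centroids.
    """
    rng = np.random.default_rng(seed)
    lo, hi = samples.min(), samples.max()
    # Interior decision boundaries, initially uniform on the data range.
    b = np.linspace(lo, hi, n_cells + 1)[1:-1].copy()
    dist = np.zeros(n_cells)  # running per-cell r-th power distortion
    for x in rng.permutation(samples):
        i = int(np.searchsorted(b, x))            # winning cell index
        edges = np.concatenate(([lo], b, [hi]))
        q = 0.5 * (edges[i] + edges[i + 1])       # midpoint codeword
        dist[i] = 0.9 * dist[i] + 0.1 * abs(x - q) ** r
        # Move each boundary away from the larger-distortion side,
        # keeping the boundaries strictly ordered inside (lo, hi).
        for j in range(len(b)):
            step = lr * np.sign(dist[j] - dist[j + 1])
            new_bj = b[j] - step * (hi - lo) / (10 * n_cells)
            left = b[j - 1] if j > 0 else lo
            right = b[j + 1] if j < len(b) - 1 else hi
            b[j] = np.clip(new_bj, left + 1e-9, right - 1e-9)
    return b
```

Running this on, say, 2000 Gaussian samples returns n_cells − 1 sorted interior boundaries; the point of the sketch is only the mechanics of boundary (rather than codeword) adaptation.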


Similar resources


A Ziv - Zakai - Rényi Lower Bound on Distortion at High Resolution

We follow a method introduced by Ziv and Zakai for finding 'informational' lower bounds on delay-constrained joint source-channel coding. Their method uses the data processing theorem for generalized measures of information. We introduce the use of Rényi's information of order α in their framework, and use high-resolution approximations to find its rate-distortion function for a source that pos...


High-resolution scalar quantization with Rényi entropy constraint

We consider optimal scalar quantization with rth power distortion and constrained Rényi entropy of order α. For sources with absolutely continuous distributions the high-rate asymptotics of the quantizer distortion has long been known for α = 0 (fixed-rate quantization) and α = 1 (entropy-constrained quantization). These results have recently been extended to quantization with Rényi entropy cons...
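For reference, the Rényi entropy H_α(p) = (1/(1−α)) log Σ_i p_i^α of a discrete distribution reduces to the log of the support size at α = 0 and to the Shannon entropy in the limit α → 1, which a quick numerical check makes concrete (the helper name below is my own):

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy (in nats) of a discrete distribution p for order alpha.

    alpha = 0 gives log of the support size; alpha -> 1 recovers Shannon.
    """
    p = [x for x in p if x > 0]
    if abs(alpha - 1.0) < 1e-12:                  # Shannon limit
        return -sum(x * math.log(x) for x in p)
    return math.log(sum(x ** alpha for x in p)) / (1.0 - alpha)

p = [0.5, 0.25, 0.25]
h0 = renyi_entropy(p, 0.0)   # log 3 (support size)
h1 = renyi_entropy(p, 1.0)   # Shannon entropy = 1.5 log 2
h2 = renyi_entropy(p, 2.0)   # collision entropy = log(8/3)
```

H_α is non-increasing in α, so here h0 ≥ h1 ≥ h2, matching the fixed-rate (α = 0) and entropy-constrained (α = 1) endpoints mentioned above.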


A generalized interpolative vector quantization method for jointly optimal quantization, interpolation, and binarization of text images

This paper presents an approach for the effective combination of interpolation with binarization of gray-level text images to reconstruct a high-resolution binary image from a lower-resolution gray-level one. We study two nonlinear interpolative techniques for text image interpolation. These nonlinear interpolation methods map quantized low-dimensional 2 × 2 image blocks to higher-dimensional 4...


Rate-Distortion Bounds for High-Resolution Vector Quantization via Gibbs's Inequality

Gibbs’s inequality states that the differential entropy of a random variable with probability density function (pdf) f is less than or equal to its cross entropy with any other pdf g defined on the same alphabet, i.e., h(X) ≤ −E[log g(X)]. Using this inequality with a cleverly chosen g, we derive a lower bound on the smallest output entropy that can be achieved by quantizing a d-dimensional sou...
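Gibbs's inequality itself is easy to verify numerically. As a standalone check (not the paper's bound): for X ~ N(0, 1), h(X) = ½ log 2πe, and taking g as the density of N(0, σ²) gives the closed-form cross entropy −E[log g(X)] = ½ log 2πσ² + 1/(2σ²), which is minimized exactly at σ² = 1, where it equals h(X):

```python
import math

def cross_entropy_gauss(sigma2, var_x=1.0):
    """Cross entropy -E[log g(X)] for X ~ N(0, var_x) with g = N(0, sigma2) density."""
    return 0.5 * math.log(2 * math.pi * sigma2) + var_x / (2 * sigma2)

h_x = 0.5 * math.log(2 * math.pi * math.e)  # differential entropy of N(0, 1)

# Gibbs: h(X) <= -E[log g(X)] for every pdf g, with equality at g = f.
for s2 in (0.5, 1.0, 2.0, 5.0):
    assert cross_entropy_gauss(s2) >= h_x - 1e-12
```

Choosing g cleverly, as the abstract describes, amounts to making this upper bound on h(X) as tight as the quantizer analysis allows.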



Journal:
  • Neural Networks

Volume 8, Issue -

Pages -

Publication date: 1995